Xbox Introduces System-Wide Feature for Reporting Abusive Voice Chat
Microsoft is taking further steps to address toxicity in multiplayer Xbox games. The company is rolling out a new feature that lets players on Xbox Series X/S and Xbox One capture a 60-second video clip of offensive or inappropriate voice chat and submit it for review by moderators.
“This feature is designed to support the widest range of in-game interactions between players and works in thousands of games that offer in-game multiplayer voice chat, including Xbox 360 backwards compatible titles,” Dave McCarthy, vice president of Xbox Player Services, wrote in a blog post.
Microsoft designed the tool to be easy to use while minimizing its impact on gameplay. When you save a clip for reporting, it stays on your Xbox for “24 online hours.” You can submit it immediately or wait until the end of your game session; you’ll receive a reminder before the 24 hours are up. If you choose not to report a clip, Xbox deletes it automatically.
No one else can access the clip unless you submit it. “Xbox does not record or download audio clips without you, the player, choosing to initiate the reporting process,” McCarthy said. Clips you save with the tool won’t appear in your recent captures and can’t be uploaded, shared, or edited; they are used for moderation purposes only. Once the safety team has reviewed your report, you’ll receive a notification letting you know whether it has taken action against the offending player.
An Xbox spokesperson told ReturnByte that the safety team uses “a set of moderation tools that leverage artificial intelligence and human moderators” to analyze the clips. Moderators review both the audio and video to determine whether anyone has violated the Community Standards.
At launch, the reactive voice reporting system lets a player report up to three people at once. “If a Moderator is unable to determine who was speaking at a given time and match it to the reported Xbox Live player, the report will be closed as inadmissible, no enforcement action will be taken, and the captured video will be deleted within 90 days,” the spokesperson said.
The prevalence of cross-platform play complicates things here as well. The safety team will not take action over inappropriate voice chat from people on other platforms. “The announced reactive voice moderation feature is specifically for reporting Xbox players to the Xbox Safety Team,” the spokesperson noted.
It’s encouraging to see Xbox addressing the issue of toxic voice chat at a platform-wide level. The PlayStation 5 has had a similar feature since it debuted in 2020.
Several studios have adopted similar approaches in their own games. In 2021, Riot said it would record Valorant voice communications but only listen to them when a report is submitted. It began testing the system last July.
Before Overwatch 2 went live last October, Blizzard said it would automatically transcribe a recording of a match’s voice chat after a player submitted a report. The company’s chat review tools analyze the transcript for signs of abuse, after which both the recording and the text file are deleted. (Note that Blizzard and Overwatch 2 may soon be owned by Microsoft.)
Initially, the Xbox voice reporting feature will be available to Alpha and Alpha Skip-Ahead Xbox Insiders in English-speaking markets: the US, Canada, the UK, Ireland, Australia, and New Zealand. Microsoft hopes Insiders will provide feedback to improve the feature, and it plans to continue investing in voice moderation and to support more languages. Xbox shares information and updates on voice chat moderation in its biannual transparency report.